
정보과학회논문지 (Journal of KIISE)


Korean Title: 사건 단어 주의 집중 메커니즘을 적용한 단일 문장 요약 생성
English Title: Single Sentence Summarization with an Event Word Attention Mechanism
Author(s): 정이안 (Ian Jung), 최수정 (Su Jeong Choi), 박세영 (Seyoung Park)
Citation: Vol. 47, No. 2, pp. 155-161 (Feb. 2020)
Korean Abstract (translated):
Summarization is a natural language processing task that shortens text while preserving the important information in the input. For single-sentence summarization, two lines of work exist: extractive methods, which binary-classify each word of the input sentence as kept or discarded and build the summary from the extracted words, and abstractive methods, which generate a summary sentence from the input. Existing extractive studies performed this binary classification using structural information about the words, while generation methods produced summary sentences with recurrent neural networks. However, these approaches can omit important information and include unnecessary information in the summary. This paper therefore proposes an event word attention mechanism that uses event words, which convey what was done in the sentence, so that summarization focuses on the important information. Given the embedding vector of each word in the sentence and event word information as input, the proposed method uses the event word information to compute attention weights that focus on the event words, and these weights are combined with an existing model. Experiments were conducted on English and Korean datasets by combining the proposed method with existing models. The results show that models with the proposed method outperform the baselines, demonstrating its effectiveness.
English Abstract:
The purpose of summarization is to generate short text that preserves the important information in the source sentence. There are two approaches to the summarization task: an extractive approach and an abstractive approach. The extractive approach determines whether each word in a source sentence is retained or discarded. The abstractive approach generates a summary of a given source sentence using a neural network such as a sequence-to-sequence model or the pointer-generator. However, these approaches can omit important information such as event words. This paper proposes an event word attention mechanism for sentence summarization. Event words carry the key meaning of a given source sentence, since they express what occurs in it. The event word attention weights are calculated from the event information of each word in the source sentence and then combined with the global attention mechanism. For evaluation, we used English and Korean datasets. Experimental results show that the models adopting event word attention outperform the existing models.
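The abstracts describe the mechanism only at a high level. As a rough illustration, the sketch below shows one way event-word attention weights could be computed from a binary event-word mask and mixed with a Luong-style global attention in a sequence-to-sequence decoder; the scoring function, the learned mixing weight, and all names here are assumptions for illustration, not the paper's actual implementation.

```python
# Minimal sketch (not the authors' code): event word attention combined
# with global attention. The Luong-style "general" score and the sigmoid
# mixing weight are assumptions; the paper only states that event-based
# weights are computed from event word information and combined with the
# existing attention mechanism.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EventWordAttention(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        self.score = nn.Linear(hidden_size, hidden_size, bias=False)
        self.mix = nn.Parameter(torch.zeros(1))  # sigmoid(0) = 0.5 at init

    def forward(self, decoder_state, encoder_outputs, event_mask):
        # decoder_state:   (batch, hidden)          current decoder hidden state
        # encoder_outputs: (batch, src_len, hidden) encoder state per source word
        # event_mask:      (batch, src_len)         1 where the word is an event word
        scores = torch.bmm(encoder_outputs,
                           self.score(decoder_state).unsqueeze(2)).squeeze(2)
        global_attn = F.softmax(scores, dim=1)  # standard global attention
        # Event attention: renormalize the same scores over event words only
        # (assumes every sentence contains at least one event word).
        event_scores = scores.masked_fill(event_mask == 0, float("-inf"))
        event_attn = F.softmax(event_scores, dim=1)
        # Convex combination of the two attention distributions.
        g = torch.sigmoid(self.mix)
        attn = (1 - g) * global_attn + g * event_attn
        context = torch.bmm(attn.unsqueeze(1), encoder_outputs).squeeze(1)
        return context, attn
```

Under these assumptions, the resulting context vector would simply replace the one produced by plain global attention in the decoder of a sequence-to-sequence or pointer-generator model.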
Keywords: sentence summarization, event word, sequence-to-sequence, attention mechanism, event word attention mechanism